September 1989   LIDS-P-1910

On Metric Entropy, Vapnik-Chervonenkis Dimension, and Learnability for a Class of Distributions 1

Author

  • Sanjeev R. Kulkarni
Abstract

In [23], Valiant proposed a formal framework for distribution-free concept learning which has generated a great deal of interest. A fundamental result regarding this framework was proved by Blumer et al. [6], characterizing those concept classes which are learnable in terms of their Vapnik-Chervonenkis (VC) dimension. More recently, Benedek and Itai [4] studied learnability with respect to a fixed probability distribution (a variant of the original distribution-free framework) and proved an analogous result characterizing learnability in this case. They also stated a conjecture regarding learnability for a class of distributions. In this report, we first point out that the condition for learnability obtained in [4] is equivalent to the notion of finite metric entropy (which has been studied in other contexts). Some relationships, in addition to those shown in [4], between the VC dimension of a concept class and its metric entropy with respect to various distributions are then discussed. Finally, we prove some partial results regarding learnability for a class of distributions.

1 Supported by the U.S. Army Research Office contract DAAL03-86-K-0171 (Center for Intelligent Control Systems) and by the Department of the Navy for SDIO.

2 Center for Intelligent Control Systems, M.I.T., 77 Massachusetts Ave., 35-423, Cambridge, MA 02139, and M.I.T. Lincoln Laboratory, 244 Wood St., Lexington, MA 02173.
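For reference, the two quantities the abstract relates can be stated as follows. These are the standard definitions, with notation assumed here (a concept class C over a domain X and a probability distribution P on X); they are not reproduced from the report itself:

\[
\mathrm{VCdim}(C) \;=\; \sup\{\, |S| : S \subseteq X \text{ finite and } C \text{ shatters } S \,\},
\]

where C shatters S if every subset of S can be written as c ∩ S for some c in C, and

\[
d_P(c_1, c_2) \;=\; P(c_1 \,\triangle\, c_2), \qquad
\mathcal{H}(\varepsilon, C, P) \;=\; \log N(\varepsilon, C, d_P),
\]

where N(ε, C, d_P) is the smallest number of balls of radius ε (in the pseudometric d_P) needed to cover C. "Finite metric entropy" then means that N(ε, C, d_P) is finite for every ε > 0, i.e., C is totally bounded under d_P.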


Similar resources

Relating Data Compression and Learnability

We explore the learnability of two-valued functions from samples using the paradigm of Data Compression. A first algorithm (compression) chooses a small subset of the sample, called the kernel. A second algorithm predicts future values of the function from the kernel, i.e., the algorithm acts as a hypothesis for the function to be learned. The second algorithm must be able to reconstruct...
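As a concrete illustration of this two-algorithm paradigm, the following is a minimal sketch specialized to interval concepts on the real line; the concept class, function names, and kernel size are illustrative assumptions, not taken from the paper itself.

# A compression scheme for interval concepts: keep only a small "kernel"
# of the labeled sample, then reconstruct a consistent hypothesis from it.

def compress(sample):
    """Compression algorithm: retain only the leftmost and rightmost
    positively labeled points (the kernel)."""
    positives = [x for x, label in sample if label == 1]
    if not positives:
        return []                      # empty kernel encodes the empty interval
    return [min(positives), max(positives)]

def reconstruct(kernel):
    """Reconstruction algorithm: turn the kernel back into a hypothesis
    that predicts labels of future points."""
    if not kernel:
        return lambda x: 0
    lo, hi = kernel[0], kernel[-1]
    return lambda x: 1 if lo <= x <= hi else 0

# Usage: labels produced by some unknown target interval.
sample = [(1.0, 0), (2.5, 1), (4.0, 1), (5.5, 0)]
hypothesis = reconstruct(compress(sample))
assert all(hypothesis(x) == label for x, label in sample)  # consistent with the sample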


Metric Entropy and Minimax Risk in Classification

We apply recent results on the minimax risk in density estimation to the related problem of pattern classification. The notion of loss we seek to minimize is an information-theoretic measure of how well we can predict the classification of future examples, given the classification of previously seen examples. We give an asymptotic characterization of the minimax risk in terms of the metric entropy...



Results on Learnability and the Vapnik-Chervonenkis Dimension*

We consider the problem of learning a concept from examples in the distribution-free model of Valiant. (An essentially equivalent model, if one ignores issues of computational difficulty, was studied by Vapnik and Chervonenkis.) We introduce the notion of dynamic sampling, wherein the number of examples examined may increase with the complexity of the target concept. This method is used to estab...


Results on learnability and the Vapnik-Chervonenkis dimension (Extended Abstract)

We consider the problem of learning a concept from examples in the distribution-free model of Valiant. (An essentially equivalent model, if one ignores issues of computational difficulty, was studied by Vapnik and Chervonenkis.) We introduce the notion of dynamic sampling, wherein the number of examples examined may increase with the complexity of the target concept. This method is used to estab...




Publication date: 1989